
    Interview with The University of Manchester Faculty e-learning Managers conducted by Graham McElearney for ALT News Online, Issue 18, November 2009.

    Graham McElearney conducted an interview with the four Faculty e-learning Managers at The University of Manchester. This document is the full transcript of the interview. The discussion covers e-learning strategy, organisational structure, current choices of tools, and the future of the institutional VLE.

    An outdoor spatially-aware audio playback platform exemplified by a virtual zoo

    Outlined in this short paper is a framework for the construction of outdoor location- and direction-aware audio applications, along with an example application that showcases the strengths of the framework and demonstrates how it works. Previous work in this area has concentrated on the spatial presentation of sound through wireless headphones, with such sounds typically presented as though originating from specific, defined spatial locations within a 3D environment. Allowing a user to move freely within this space while the sound is adjusted dynamically, as we do here, further enhances the perceived reality of the virtual environment. We realise this through real-time adjustment of the two channels of audio presented to the headphones, using readings of the user's head orientation and location which are in turn made possible by sensors mounted on the headphones. Aside from proof-of-concept indoor applications, more user-responsive applications of spatial audio delivery have not previously been prototyped or explored. In this paper we present an audio-spatial presentation platform along with a primary demonstration application for an outdoor environment, which we call a "virtual audio zoo". This application explores our techniques for further improving the realism of the audio-spatial environments we can create, and assesses what types of future application are possible.
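
    The abstract gives no implementation detail, but as a rough illustration of the kind of real-time two-channel adjustment it describes, the Python sketch below (the names and the simple pan/delay model are our own assumptions, not the authors' method) derives per-ear gains and an interaural time difference for one virtual source from the listener's position and head yaw.

        import numpy as np

        SPEED_OF_SOUND = 343.0  # metres per second
        EAR_SPACING = 0.18      # approximate inter-ear distance in metres

        def stereo_params(listener_pos, head_yaw, source_pos):
            """Per-ear gains and interaural time difference for one virtual source.

            listener_pos, source_pos: (x, y) positions in metres.
            head_yaw: listener heading in radians, 0 along the +x axis.
            A crude distance/angle model, not a full HRTF.
            """
            dx = source_pos[0] - listener_pos[0]
            dy = source_pos[1] - listener_pos[1]
            distance = max(np.hypot(dx, dy), 0.1)  # clamp to avoid blow-up up close
            # Angle of the source relative to the facing direction, wrapped to [-pi, pi];
            # positive means the source is to the listener's left.
            azimuth = (np.arctan2(dy, dx) - head_yaw + np.pi) % (2 * np.pi) - np.pi
            gain = 1.0 / distance                  # inverse-distance attenuation
            # Constant-power pan between the two ears.
            pan = np.clip(azimuth / np.pi + 0.5, 0.0, 1.0)  # 0 = hard right, 1 = hard left
            left_gain = gain * np.sin(pan * np.pi / 2.0)
            right_gain = gain * np.cos(pan * np.pi / 2.0)
            # The ear further from the source hears it slightly later
            # (positive itd = right ear delayed relative to left).
            itd = EAR_SPACING * np.sin(azimuth) / SPEED_OF_SOUND
            return left_gain, right_gain, itd

    In a live system, parameters like these would be recomputed every few milliseconds from fresh sensor readings and applied to the outgoing audio buffers.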

    Spatially augmented audio delivery: applications of spatial sound awareness in sensor-equipped indoor environments

    Current mainstream audio playback paradigms take no account of a user's physical location or orientation when delivering audio through headphones or speakers. Audio is thus usually presented as a static perception, even though sound is naturally a dynamic 3D phenomenon, and playback fails to take advantage of our innate psycho-acoustical perception of sound-source locations around us. Described in this paper is an operational platform which we have built to augment the sound from a generic set of wireless headphones, in a way that overcomes this spatial-awareness limitation of audio playback in indoor 3D environments which are both location-aware and sensor-equipped. The platform provides access to an audio-spatial presentation modality which, by its nature, lends itself to numerous cross-disciplinary applications. In the paper we present the platform and two demonstration applications.
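
    Again as an assumption-laden sketch rather than the authors' implementation, an indoor platform of this kind might mix several fixed virtual sources into a single stereo block for the headphones, reusing the hypothetical stereo_params helper sketched above.

        import numpy as np

        def mix_sources(listener_pos, head_yaw, sources, n_samples):
            """Mix several fixed virtual sources into one stereo output block.

            sources: list of ((x, y) position, mono_block) pairs, each mono_block
                     holding n_samples of audio for this frame.
            Returns an (n_samples, 2) array: column 0 = left ear, column 1 = right.
            """
            out = np.zeros((n_samples, 2))
            for position, mono in sources:
                # stereo_params is the hypothetical helper from the earlier sketch.
                left, right, _itd = stereo_params(listener_pos, head_yaw, position)
                out[:, 0] += left * mono
                out[:, 1] += right * mono
            return np.clip(out, -1.0, 1.0)  # guard against clipping after summing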

    Optimising the number of channels in EEG-augmented image search

    Recent proof-of-concept research has shown the applicability of Brain Computer Interface (BCI) technology, in combination with the human visual system, to classifying images. The basic premise is that images which arouse a participant's attention generate a detectable response in their brainwaves, measurable using an electroencephalograph (EEG). When a participant is given a target class of images to search for, each image belonging to that target class presented within a stream of images should elicit a distinctly detectable neural response. Previous work in this domain has primarily focused on validating the technique on proof-of-concept image sets that demonstrate desired properties, and on examining its capabilities at various image presentation speeds. In this paper we expand on this by examining the capability of the technique when using a reduced number of EEG channels, and the impact this has on detection accuracy.
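
    The paper's exact pipeline is not reproduced in the abstract; a minimal Python sketch of the kind of channel-reduction experiment described, assuming epoched EEG data and a simple linear classifier (both assumptions, since the authors' classifier is not named here), could compare detection accuracy over growing channel subsets.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        def accuracy_vs_channels(epochs, labels, channel_order):
            """Score target/non-target detection for growing channel subsets.

            epochs: (n_trials, n_channels, n_samples) EEG epochs, each
                    time-locked to one image presentation.
            labels: 1 for target-class images, 0 for non-targets.
            channel_order: channel indices ranked by assumed usefulness,
                           e.g. parieto-occipital electrodes first
                           (a hypothetical ranking).
            """
            scores = {}
            for k in range(1, len(channel_order) + 1):
                subset = epochs[:, channel_order[:k], :]
                # Flatten (channels x time) into one feature vector per trial.
                features = subset.reshape(len(subset), -1)
                clf = LinearDiscriminantAnalysis()
                scores[k] = cross_val_score(clf, features, labels, cv=5).mean()
            return scores

    Plotting the returned scores against k would show how quickly accuracy saturates as channels are added, which is the trade-off the paper investigates.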

    Eye fixation related potentials in a target search task

    Typically, BCIs (Brain Computer Interfaces) are found in rehabilitative or restorative applications, often giving users a medium of communication that is otherwise unavailable through conventional means. Recently, however, there has been growing interest in using BCI to assist users in searching for images. A class of neural signals often leveraged in common BCI paradigms are ERPs (Event Related Potentials), which appear in users' EEG (electroencephalograph) signals in response to various sensory events. One such ERP is the P300, typically elicited in an oddball experiment in which a subject's attention is oriented towards a deviant stimulus among a stream of presented images. It has been shown that these neural responses can drive an image search or labeling task, in which images are ranked by examining the presence of such ERP signals in response to their display. To date, systems like these have been demonstrated presenting sequences of images containing targets at up to 10 Hz; however, the target images in those tasks are salient enough that no eye movement is needed to detect them. In this paper we analyse the presence of discriminating signals when they are offset to the times of eye fixations, in a visual search task where detection of target images does require eye fixations.
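
    As a hedged sketch of the fixation-locked analysis the abstract describes (assuming a synchronised eye tracker supplies fixation onset times; the function and data layout are illustrative, not the authors' code), EEG epochs can be cut relative to fixation onsets instead of image onsets.

        import numpy as np

        def fixation_locked_epochs(eeg, srate, fixation_times, pre=0.2, post=0.6):
            """Cut EEG epochs aligned to eye-fixation onsets rather than image onsets.

            eeg: (n_channels, n_samples) continuous recording.
            srate: sampling rate in Hz.
            fixation_times: fixation onset times in seconds from the eye tracker.
            Returns (n_fixations, n_channels, window samples), with each channel
            baseline-corrected against its pre-fixation mean.
            """
            pre_n, post_n = int(pre * srate), int(post * srate)
            epochs = []
            for t in fixation_times:
                onset = int(t * srate)
                if onset - pre_n < 0 or onset + post_n > eeg.shape[1]:
                    continue  # skip fixations too close to the recording edges
                epoch = eeg[:, onset - pre_n:onset + post_n].astype(float)
                # Subtract the pre-fixation baseline from each channel.
                epoch -= epoch[:, :pre_n].mean(axis=1, keepdims=True)
                epochs.append(epoch)
            return np.stack(epochs)

    Epochs extracted this way could then be scored with the same target/non-target classification approach used for stimulus-locked analyses.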

    Inspection of transparent surfaces using photosensitive paper

    The window surface is laid flat on top of photosensitive paper, and the opposite side of the glass is covered with a black cloth. The window edges are then illuminated by a light flash delivered through fiber optics. The exposed paper is processed and inspected: it reveals scratches, bubbles, dust particles, and fingerprints on the glass surface.